exponential linear unit


Persian

1 Electricity & Electronics:: واحد خطی-نمایی (exponential-linear unit)

The initialization strategy for the ReLU activation function (and its variants, including the ELU activation described shortly) is sometimes called He initialization (after the last name of its author).

Last but not least, a 2015 paper by Djork-Arné Clevert et al. proposed a new activation function called the exponential linear unit (ELU) that outperformed all the ReLU variants in their experiments: training time was reduced and the neural network performed better on the test set.

ELU activation function:

$$\mathrm{ELU}_{\alpha}(z) = \begin{cases} \alpha\,(\exp(z) - 1) & \text{if } z < 0 \\ z & \text{if } z \ge 0 \end{cases}$$

The hyperparameter α defines the value that the ELU function approaches when z is a large negative number.
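As a minimal sketch of the formula above, the following NumPy function evaluates the two branches of the ELU; the function name elu and the default α = 1 are illustrative choices, not part of the source excerpt:

```python
import numpy as np

def elu(z, alpha=1.0):
    """Exponential linear unit: alpha*(exp(z) - 1) for z < 0, z for z >= 0."""
    z = np.asarray(z, dtype=float)
    # Clamp the exponent's argument at 0 so exp() never overflows for large
    # positive z; the clamped values are only used on the z < 0 branch.
    neg = np.minimum(z, 0.0)
    return np.where(z < 0, alpha * np.expm1(neg), z)

print(elu([-10.0, -1.0, 0.0, 2.0]))
# ≈ [-0.99995, -0.632, 0.0, 2.0]
```

Note how the output at z = −10 is already close to −α, illustrating the sentence above: as z becomes a large negative number, the ELU approaches −α.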

Terms from the Iranian Translators Network (شبکه مترجمین ایران)

